
Stochastic Process

If a collection of random variables represents a process that proceeds randomly in time, it is called a stochastic process.

  • A simple random walk can be thought of as a model for repeated gambling

Theorem: Let $X_n$ be a simple random walk starting at $a$ with probability $p$ of winning each bet, and let $n$ be a positive integer. If $k$ is an integer such that $-n \le k \le n$ and $n + k$ is even, then $P(X_n = a + k) = {n \choose \frac{n+k}{2}} p^{\frac{n+k}{2}} (1 - p)^{\frac{n-k}{2}}$, with expectation $E[X_n] = a + n(2p - 1)$.

Proof: Consider a simple random walk where $a$ and $\rho$ respectively denote the starting fortune and the probability of winning each bet.

  • Let $W_n$ be the number of wins and $L_n$ the number of losses after $n$ bets; then $n = W_n + L_n$
  • Let $X_n$ be the current fortune; then $X_n = a + W_n - L_n \implies X_n + n = a + 2W_n$, so that $W_n = \frac{X_n + n - a}{2}$
  • Then $W_n$ has a binomial distribution with $n$ trials and success probability $p = \rho$, so $P(X_n = a + k) = P\left(W_n = \frac{n + k}{2}\right)$, which gives the formula above (illustrated by the simulation sketch below)
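
A minimal Monte Carlo check of the theorem, using only Python's standard library; the values of $a$, $p$, $n$, and $k$ below are arbitrary illustrative choices:

```python
import math
import random

def simulate_walk(a, p, n):
    """One simple random walk of n bets starting at fortune a."""
    x = a
    for _ in range(n):
        x += 1 if random.random() < p else -1
    return x

a, p, n, k = 0, 0.6, 10, 2          # illustrative values; n + k must be even
trials = 200_000
samples = [simulate_walk(a, p, n) for _ in range(trials)]

# Values predicted by the theorem.
prob_theory = math.comb(n, (n + k) // 2) * p ** ((n + k) // 2) * (1 - p) ** ((n - k) // 2)
mean_theory = a + n * (2 * p - 1)

prob_sim = sum(x == a + k for x in samples) / trials
mean_sim = sum(samples) / trials
print(f"P(X_n = a + k): theory {prob_theory:.4f}, simulation {prob_sim:.4f}")
print(f"E[X_n]:         theory {mean_theory:.4f}, simulation {mean_sim:.4f}")
```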

Gambler's Ruin: Let $X_n$ be a simple random walk with some initial fortune $a$ and some probability $\rho$ of winning each bet. Let $c > a$ be some other integer. The gambler's ruin question is: if you repeatedly bet $1, what is the probability that you will reach a fortune of $c before you lose all your money by reaching a fortune of $0?
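
The question can be explored numerically before deriving a closed form. A minimal simulation sketch follows; the starting fortune, target, and win probability are arbitrary illustrative choices, and for a fair game ($\rho = 1/2$) the estimate should be close to the known value $a/c$:

```python
import random

def ruin_probability(a, c, p, trials=100_000):
    """Estimate P(reach fortune c before 0) when starting at a and
    winning each $1 bet independently with probability p."""
    successes = 0
    for _ in range(trials):
        x = a
        while 0 < x < c:
            x += 1 if random.random() < p else -1
        successes += (x == c)
    return successes / trials

# Illustrative example: start with $3, aim for $10, fair bets.
print(ruin_probability(a=3, c=10, p=0.5))   # should be near 3/10 for a fair game
```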

Counting Process

A stochastic process $\{N(t): t \ge 0\}$ is said to be a counting process if $N(t)$ represents the total number of events that have occurred up to time $t$. A counting process must satisfy the following conditions:

  1. $N(t) \ge 0$
  2. $N(t)$ is integer-valued
  3. $s < t \implies N(s) \le N(t)$
  4. For all $s < t$, $N(t) - N(s)$ equals the number of events that have occurred in the interval $(s, t]$

A counting process $\{N(t): t \ge 0\}$ has independent increments if the numbers of events that occur in disjoint time intervals are independent of each other.

A counting process $\{N(t): t \ge 0\}$ has stationary increments if the distribution of the number of events in an interval depends only on the length of the interval.

Poisson Process

A counting process $\{N(t): t \ge 0\}$ is a Poisson process with rate $\lambda$ if:

  • $N(0) = 0$
  • The process has independent increments and stationary increments
  • The number of events in any interval of length $t$ is Poisson distributed: $N(t) \sim \text{Poi}(\lambda t)$ with $E(N(t)) = \lambda t$ and $Var(N(t)) = \lambda t$

Alternatively, we can use the equivalent characterization:

  • $N(0) = 0$
  • The process has independent and stationary increments
  • $P(N(h) = 1) = \lambda h + o(h)$
    • that is, $\lim_{h \to 0} \frac{P(N(h) = 1) - \lambda h}{h} = 0$
    • $o(h)$ denotes any function satisfying $\lim_{h \to 0} \frac{o(h)}{h} = 0$
  • $P(N(h) \ge 2) = o(h)$

Let the inter-arrival time of a Poisson process be $R_i =$ the waiting time for the $i$th event after the $(i-1)$th event. Then $R_i \sim \text{Exp}(\lambda)$ with $E(R_i) = \frac{1}{\lambda}$ and $Var(R_i) = \frac{1}{\lambda^2}$.
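
Since the inter-arrival times are $\text{Exp}(\lambda)$, a Poisson process can be simulated by accumulating exponential waiting times. A minimal sketch (the rate $\lambda$ and horizon $t$ are arbitrary illustrative choices) that checks $E(N(t)) = Var(N(t)) = \lambda t$:

```python
import random

def poisson_count(lam, t):
    """Count events in [0, t] by accumulating Exp(lam) inter-arrival times."""
    clock, count = 0.0, 0
    while True:
        clock += random.expovariate(lam)
        if clock > t:
            return count
        count += 1

lam, t, trials = 2.0, 5.0, 50_000          # illustrative rate and horizon
counts = [poisson_count(lam, t) for _ in range(trials)]
mean = sum(counts) / trials
var = sum((c - mean) ** 2 for c in counts) / trials
print(f"E[N(t)]:   theory {lam * t:.2f}, simulation {mean:.2f}")
print(f"Var(N(t)): theory {lam * t:.2f}, simulation {var:.2f}")
```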

Conditioning a Poisson process: for $s < t$, given $N(t) = n$, we have $N(s) \mid N(t) = n \sim \text{Bin}(n, s/t)$.
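
A quick way to see this numerically is rejection sampling: simulate the process on $[0, t]$, keep only the runs with $N(t) = n$, and record $N(s)$. The parameters below are arbitrary illustrative choices; the conditional mean should be close to $n \cdot s/t$:

```python
import random

lam, s, t, n = 1.0, 2.0, 5.0, 8            # illustrative rate, times, and conditioning value
accepted = []

while len(accepted) < 2_000:
    # Simulate arrivals on [0, t] via Exp(lam) inter-arrival times.
    arrivals, clock = [], 0.0
    while True:
        clock += random.expovariate(lam)
        if clock > t:
            break
        arrivals.append(clock)
    if len(arrivals) == n:                              # keep only runs with N(t) = n
        accepted.append(sum(a <= s for a in arrivals))  # record N(s) for that run

mean = sum(accepted) / len(accepted)
print(f"E[N(s) | N(t) = n]: theory {n * s / t:.2f}, simulation {mean:.2f}")
```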

Brownian Motion

Let $\{X_n\}$ be a simple random walk with $X_0 = 0$ and $X_i = \sum_{j=1}^{i} Z_j$, where the $Z_i$ are i.i.d. (for the simple random walk, $P(Z_i = +1) = P(Z_i = -1) = \frac{1}{2}$, so each step has mean $0$ and variance $1$). We define a new process $Y_t^{(M)} = \frac{1}{\sqrt{M}} \sum_{i=1}^{tM} Z_i$, where $M$ is a positive integer. Then $\{B_t\}_{t \ge 0}$ is a Brownian motion if it is the limit of $\{Y_t^{(M)}\}$ as $M \to \infty$. That is, Brownian motion has the following properties (a simulation sketch follows the list below):

  • $E[Y_t^{(M)}] = 0$
  • $Var(Y_t^{(M)}) = \left(\frac{1}{\sqrt{M}}\right)^2 tM = t$
  • As $M \to \infty$, $Y_t^{(M)} \to N(0, t)$ in distribution by the CLT; that is, $B_t \sim N(0, t)$
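
A minimal simulation of the rescaled walk $Y_t^{(M)}$ (with arbitrary illustrative choices of $t$ and $M$), checking that its mean is near $0$ and its variance near $t$, consistent with $B_t \sim N(0, t)$:

```python
import random

def Y(t, M):
    """One sample of Y_t^(M): (1/sqrt(M)) times a sum of tM fair +/-1 steps."""
    steps = int(t * M)
    total = sum(1 if random.random() < 0.5 else -1 for _ in range(steps))
    return total / M ** 0.5

t, M, trials = 2.0, 500, 10_000            # illustrative choices
samples = [Y(t, M) for _ in range(trials)]
mean = sum(samples) / trials
var = sum((y - mean) ** 2 for y in samples) / trials
print(f"mean (should be near 0):     {mean:.3f}")
print(f"variance (should be near t): {var:.3f}")
```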

For a Brownian motion with $0 < t < s$, we have $B_s - B_t \sim N(0, s - t)$.

  • the covariance is $Cov(B_s, B_t) = \min(s, t)$: for $t < s$, $Cov(B_t, B_s) = Cov(B_t, B_t + (B_s - B_t)) = Var(B_t) + 0 = t$ by independent increments

Then we can give a formal definition of Brownian motion: a Brownian motion is a continuous-time process $\{B_t\}_{t \ge 0}$ with the following properties:

  • $B_0 = 0$
  • Normally distributed: $B_t \sim N(0, t)$
  • Independent normal increments: $B_s - B_t \sim N(0, s - t)$ for $t < s$
  • Covariance structure: $Cov(B_s, B_t) = \min(s, t)$ (checked numerically in the sketch after this list)
  • Continuous sample paths (the function $t \mapsto B_t$ is continuous, but not differentiable)
    • Differentiability at $t$ would require $\lim_{h \to 0} \frac{B_{t+h} - B_t}{h}$ to exist, but $B_{t+h} - B_t \sim N(0, h)$, so $\frac{B_{t+h} - B_t}{h} \sim N(0, \frac{1}{h})$, whose variance blows up as $h \to 0$; hence $B_t$ is not differentiable
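
A rough numerical check of the covariance structure, using a discretized path built from independent $N(0, dt)$ increments; the times, step size, and trial count below are arbitrary illustrative choices:

```python
import random

def brownian_pair(t, s, dt=0.01):
    """Sample (B_t, B_s) for t < s from one discretized path with N(0, dt) increments."""
    b, clock, b_t = 0.0, 0.0, None
    while clock < s:
        b += random.gauss(0, dt ** 0.5)
        clock += dt
        if b_t is None and clock >= t:
            b_t = b
    return b_t, b

t, s, trials = 1.0, 2.0, 10_000            # illustrative times and trial count
pairs = [brownian_pair(t, s) for _ in range(trials)]
mean_t = sum(x for x, _ in pairs) / trials
mean_s = sum(y for _, y in pairs) / trials
cov = sum((x - mean_t) * (y - mean_s) for x, y in pairs) / trials
print(f"Cov(B_t, B_s): theory min(t, s) = {min(t, s)}, simulation {cov:.3f}")
```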

Let $B_t$ be a Brownian motion and let $X_t = a + \delta t + \sigma B_t$ be a diffusion; then:

  • $E[X_t] = a + \delta t$
  • $Var(X_t) = \sigma^2 t$
  • $X_t \sim N(a + \delta t, \sigma^2 t)$
  • we call $a$ the initial value, $\delta$ the drift rate, and $\sigma$ the volatility parameter (see the simulation sketch below)
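
A minimal sketch that samples $X_t$ at a fixed time by drawing $B_t \sim N(0, t)$ directly, and compares the sample mean and variance with $a + \delta t$ and $\sigma^2 t$; the parameter values are arbitrary illustrative choices:

```python
import random

a, delta, sigma, t = 5.0, 1.5, 2.0, 3.0    # illustrative diffusion parameters
trials = 100_000

# B_t ~ N(0, t), so X_t = a + delta*t + sigma*B_t can be sampled directly.
samples = [a + delta * t + sigma * random.gauss(0, t ** 0.5) for _ in range(trials)]
mean = sum(samples) / trials
var = sum((x - mean) ** 2 for x in samples) / trials
print(f"E[X_t]:   theory {a + delta * t:.2f}, simulation {mean:.2f}")
print(f"Var(X_t): theory {sigma ** 2 * t:.2f}, simulation {var:.2f}")
```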

Martingale

If we have a Markov chain or stochastic process that stays the same on average, we call it a martingale.

  • Let $X_0, X_1, \cdots$ be a Markov chain. The chain is a martingale if $\forall n \ge 0$, $E(X_{n+1} - X_n \mid X_n) = 0$ (i.e. $\sum_{j \in S} j\, p_{ij} = i$).
  • A stochastic process $\{Z_n : n \ge 1\}$ is a martingale if $\forall n \ge 1$, $E(|Z_n|) < \infty$ and $E(Z_{n+1} \mid Z_1, \cdots, Z_n) = Z_n$.
  • Similarly, any sequence $\{X_n\}_{n=0}^{\infty}$ is a martingale if $\forall n \ge 0$, $E(X_{n+1} \mid X_0, \cdots, X_n) = X_n$.
  • Brownian motion is a martingale.

Let $X_n$ be a stochastic process, and let $T$ be a random variable taking values in $\{0, 1, 2, \ldots\}$. Then $T$ is a STOPPING TIME if, for all $m \ge 0$, the event $\{T = m\}$ is independent of the values of $X_n$ for $n > m$. That is, when deciding whether or not $T = m$, we are not allowed to look at the values of $X_n$ for $n > m$.

OPTIONAL STOPPING TIME THEOREM: Suppose $\{X_n\}$ is a martingale with $X_0 = a$ and $T$ is a stopping time. Then $E(X_T) = a$ if either of the following holds (a simulation sketch follows this list):

  • The martingale is bounded up to time $T$, i.e., there exists $M > 0$ such that $|X_n| \le M$ for all $n \le T$.
  • The stopping time is bounded, i.e., there exists $M > 0$ such that $T \le M$.
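
A minimal illustration of the theorem with a fair $\pm 1$ random walk (a martingale) stopped when it first hits $0$ or $c$: the walk is bounded up to $T$, so the theorem applies and $E(X_T)$ should be close to the starting value $a$. The values of $a$ and $c$ are arbitrary illustrative choices:

```python
import random

def stopped_value(a, c):
    """Run a fair +/-1 walk from a until it first hits 0 or c; return X_T."""
    x = a
    while 0 < x < c:
        x += 1 if random.random() < 0.5 else -1
    return x

a, c, trials = 4, 10, 100_000              # illustrative bounds; |X_n| <= c up to T
mean_XT = sum(stopped_value(a, c) for _ in range(trials)) / trials
print(f"E[X_T]: theory a = {a}, simulation {mean_XT:.3f}")
```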